
    Recent progress on flat plate solar collectors equipped with nanofluid and turbulator: state of the art.

    This paper reviews the impact of employing inserts, nanofluids, and their combinations on the thermal performance of flat plate solar collectors, outlining recent studies on this type of collector. In particular, the factors that influence the operation of flat plate solar collectors with nanofluids are investigated, including the type of nanoparticle, the base fluid, the nanoparticle volume fraction, and the thermal efficiency. According to the reports, the most commonly employed nanofluids in flat plate solar collectors are Al2O3, CuO, and TiO2. Moreover, 62.34%, 16.88%, and 11.26% of the utilized nanofluids have volume fractions between 0 and 0.5%, 0.5 and 1%, and 1 and 2%, respectively. Twisted tape is the most widely employed insert, with a share of about one-third. Furthermore, the highest reported thermal efficiency of a flat plate solar collector with a turbulator is about 86.5%. The review closes with a discussion of recent analyses on the simultaneous use of nanofluids and inserts in flat plate solar collectors; for this combination, a maximum efficiency of about 84.85% has been obtained. Very few works have addressed combining nanofluids and turbulators in flat plate solar collectors, and more detailed work remains to be done using more diverse nanofluids (both single and hybrid types) and turbulators with more efficient geometries.
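    To make the efficiency figures above concrete, the sketch below computes the instantaneous thermal efficiency of a flat plate collector as useful heat gain over incident solar power; the flow rate, temperatures, irradiance, and area are illustrative assumptions, not values taken from the reviewed studies.

```python
# Illustrative only: instantaneous thermal efficiency of a flat plate collector,
# eta = m_dot * c_p * (T_out - T_in) / (G * A). All numbers below are assumed
# example values, not results reported in the reviewed studies.

def collector_efficiency(m_dot, c_p, t_in, t_out, irradiance, area):
    """Return useful heat gain divided by incident solar power."""
    q_useful = m_dot * c_p * (t_out - t_in)   # W
    q_incident = irradiance * area            # W
    return q_useful / q_incident

if __name__ == "__main__":
    eta = collector_efficiency(
        m_dot=0.05,        # kg/s, assumed working-fluid flow rate
        c_p=4180.0,        # J/(kg.K), water-based nanofluid approximated as water
        t_in=308.0,        # K, inlet temperature
        t_out=314.0,       # K, outlet temperature
        irradiance=900.0,  # W/m^2, incident solar irradiance
        area=2.0,          # m^2, absorber area
    )
    print(f"thermal efficiency ~ {eta:.2%}")   # ~69.7% for these assumed values
```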

    Elucidation on the Effect of Operating Temperature to the Transport Properties of Polymeric Membrane Using Molecular Simulation Tool

    Existing reports on gas transport properties within polymeric membranes as a direct consequence of operating temperature are few in number and have arrived at diverging conclusions. This scarcity has been associated with the challenges of fabricating defect-free membranes and with empirical investigations of gas permeation performance at the laboratory scale, which are often time consuming and costly. Molecular simulation has been proposed as a feasible alternative to experimentally studied materials for providing insight into gas transport characteristics. Hence, a sequence of molecular modelling procedures has been proposed to simulate polymeric membranes at varying operating temperatures in order to elucidate the effect of temperature on gas transport behaviour. The simulation model has been validated against experimental data with satisfactory agreement. Solubility decreases as temperature increases (by an average factor of 1.78), while the opposite is observed for gas diffusivity (an average factor of 1.32) when the temperature is increased from 298.15 K to 323.15 K. In addition, permeability is found to decrease by a factor of 1.36 as the temperature is increased.
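    The solution-diffusion relation P = D × S ties the three reported factors together; the short sketch below checks that a 1.78-fold drop in solubility combined with a 1.32-fold rise in diffusivity implies roughly the 1.36-fold drop in permeability quoted above. The factors come from the abstract; the baseline values are placeholders since only ratios matter.

```python
# Solution-diffusion model: permeability P = D * S.
# Average factors taken from the abstract; baseline D0 and S0 are arbitrary
# placeholders because only the ratios matter here.

D0, S0 = 1.0, 1.0        # diffusivity and solubility at 298.15 K (arbitrary units)
D1 = D0 * 1.32           # diffusivity rises by an average factor of 1.32 at 323.15 K
S1 = S0 / 1.78           # solubility falls by an average factor of 1.78

P0, P1 = D0 * S0, D1 * S1
print(f"permeability ratio P0/P1 = {P0 / P1:.2f}")  # ~1.35, consistent with the reported 1.36
```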

    Proxy Re-Encryption and Re-Signatures from Lattices

    Proxy re-encryption (PRE) and Proxy re-signature (PRS) were introduced by Blaze, Bleumer and Strauss [Eurocrypt \u2798]. Basically, PRE allows a semi-trusted proxy to transform a ciphertext encrypted under one key into an encryption of the same plaintext under another key, without revealing the underlying plaintext. Since then, many interesting applications have been explored, and many constructions in various settings have been proposed, while PRS allows a semi-trusted proxy to transform Alice\u27s signature on a message into Bob\u27s signature on the same message, but the proxy cannot produce new valid signature on new messages for either Alice or Bob. Recently, for PRE related progress, Cannetti and Honhenberger [CCS \u2707] defined a stronger notion -- CCA-security and construct a bi-directional PRE scheme. Later on, several work considered CCA-secure PRE based on bilinear group assumptions. Very recently, Kirshanova [PKC \u2714] proposed the first single-hop CCA1-secure PRE scheme based on learning with errors (LWE) assumption. For PRS related progress, Ateniese and Hohenberger [CCS\u2705] formalized this primitive and provided efficient constructions in the random oracle model. At CCS 2008, Libert and Vergnaud presented the first multi-hop uni-directional proxy re-signature scheme in the standard model, using assumptions in bilinear groups. In this work, we first point out a subtle but serious mistake in the security proof of the work by Kirshanova. This reopens the direction of lattice-based CCA1-secure constructions, even in the single-hop setting. Then we construct a single-hop PRE scheme that is proven secure in our new tag-based CCA-PRE model. Next, we construct the first multi-hop PRE construction. Lastly, we also construct the first PRS scheme from lattices that is proved secure in our proposed unified security mode

    Learning Tversky Similarity

    In this paper, we advocate Tversky's ratio model as an appropriate basis for computational approaches to semantic similarity, that is, the comparison of objects such as images in a semantically meaningful way. We consider the problem of learning Tversky similarity measures from suitable training data indicating whether two objects tend to be similar or dissimilar. Experimentally, we evaluate our approach to similarity learning on two image datasets, showing that it performs very well compared to existing methods.
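    For reference, Tversky's ratio model scores the similarity of two feature sets A and B as f(A ∩ B) / (f(A ∩ B) + α·f(A \ B) + β·f(B \ A)). The sketch below implements this with simple set cardinality as the salience function f; the α and β values are illustrative defaults, not parameters learned by the paper's method.

```python
# Tversky's ratio model over binary feature sets, using set cardinality as the
# salience function f. Alpha/beta below are illustrative, not learned values.

def tversky_similarity(a: set, b: set, alpha: float = 0.5, beta: float = 0.5) -> float:
    common = len(a & b)          # f(A ∩ B)
    a_only = len(a - b)          # f(A \ B)
    b_only = len(b - a)          # f(B \ A)
    denom = common + alpha * a_only + beta * b_only
    return common / denom if denom else 1.0   # two empty sets count as identical

# Example: two images described by sets of detected attributes.
img1 = {"sky", "tree", "water", "boat"}
img2 = {"sky", "tree", "grass"}
print(tversky_similarity(img1, img2))         # symmetric here since alpha == beta
```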

    Automated Analysis of Chest Radiographs for Cystic Fibrosis Scoring

    We present a framework to analyze chest radiographs for cystic fibrosis using machine learning methods. We compare the representational power of deep learning features with traditional texture features; specifically, we employ VGG-16 based deep learning features, and Tamura and Gabor filter based textural features, to represent the cystic fibrosis images. We demonstrate that VGG-16 features perform best, with a maximum agreement of 82%. In addition, due to their limited dimensionality, Tamura features for unsegmented images achieve no more than 50% agreement; after segmentation, however, the accuracy of Tamura features can reach 78%. Alongside the deep learning features, we also compare back-propagation neural network and sparse coding classifiers with a typical SVM classifier using a polynomial kernel. The results show that the neural network and sparse coding classifiers outperform the SVM in most cases; only with insufficient training samples does the SVM demonstrate higher accuracy.
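    The sketch below shows one plausible way to pair pretrained VGG-16 features with an SVM using a polynomial kernel, as in the comparison above; the torchvision/scikit-learn pipeline, layer choice, and preprocessing are assumptions for illustration, not the authors' exact setup.

```python
# Hypothetical feature-extraction pipeline: pretrained VGG-16 descriptors fed to an
# SVM with a polynomial kernel. Layer choice and preprocessing are assumptions.
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.svm import SVC

vgg = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
vgg.classifier = vgg.classifier[:-1]          # drop the final layer -> 4096-d features
vgg.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.Grayscale(num_output_channels=3),       # radiographs are single-channel
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(pil_images):
    batch = torch.stack([preprocess(img) for img in pil_images])
    return vgg(batch).numpy()                 # (N, 4096) feature matrix

# train_images / train_scores would come from the annotated radiograph set:
# clf = SVC(kernel="poly", degree=3).fit(extract_features(train_images), train_scores)
```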

    Efficiently Obfuscating Re-Encryption Program under DDH Assumption

    A re-encryption program (or circuit) transforms a ciphertext encrypted under Alice's public key $pk_1$ into a ciphertext of the same message encrypted under Bob's public key $pk_2$. Hohenberger et al. (TCC 2007) constructed a pairing-based obfuscator for a family of circuits implementing the re-encryption functionality under a new notion of obfuscation called \textit{average-case secure obfuscation}. Chandran et al. (PKC 2014) proposed a lattice-based construction for the same. The construction given by Hohenberger et al. can only support encryptions of messages from a polynomial-size space, its decryption algorithm may have to perform a polynomial number of pairing operations in the worst case, and its proof of security relies on strong assumptions. On the other hand, the construction given by Chandran et al. relies on standard assumptions on lattices but only satisfies a relaxed notion of correctness. In this work we propose a simple and efficient obfuscator for the re-encryption functionality which does not suffer from \textit{any} of the above drawbacks. In particular, our construction satisfies the strongest notion of correctness, supports encryption of messages from an exponential-sized domain, and relies on the standard DDH assumption. We also strengthen the black-box security model for the encryption/re-encryption system proposed by Hohenberger et al. and prove the average-case virtual black-box property of our obfuscator, as well as the security of our encryption/re-encryption system in the strengthened model, under the DDH assumption. All our proofs are in the standard model.
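    One intuition behind average-case secure obfuscation of re-encryption is that the proxy's output should look like a fresh ciphertext. The toy sketch below shows ElGamal re-randomization in a DDH-style group, where multiplying in an encryption of 1 refreshes the randomness of a ciphertext; the parameters and interface are illustrative assumptions, not the paper's construction.

```python
# Toy ElGamal re-randomization: multiplying a ciphertext by an encryption of 1
# refreshes its randomness, so the result is distributed like a fresh encryption
# of the same message. Demo parameters only; not the paper's obfuscator.
import secrets

p, q, g = 1019, 509, 4   # tiny p = 2q+1 subgroup parameters, for illustration only

def keygen():
    x = secrets.randbelow(q - 1) + 1
    return x, pow(g, x, p)                     # (sk, pk = g^x)

def encrypt(pk, m):
    r = secrets.randbelow(q - 1) + 1
    return pow(g, r, p), (m * pow(pk, r, p)) % p   # (g^r, m * pk^r)

def decrypt(x, ct):
    c1, c2 = ct
    return (c2 * pow(pow(c1, x, p), -1, p)) % p

def rerandomize(pk, ct):
    c1, c2 = ct
    d1, d2 = encrypt(pk, 1)                    # fresh encryption of the identity
    return (c1 * d1) % p, (c2 * d2) % p        # component-wise product

x, pk = keygen()
ct = encrypt(pk, 7)
ct2 = rerandomize(pk, ct)
assert ct2 != ct and decrypt(x, ct2) == 7
```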

    CARIBE: Cascaded IBE for Maximum Flexibility and User-side Control

    Mass surveillance and a lack of end-user encryption, coupled with a growing demand for key escrow under legal oversight and security concerns about certificate authorities, raise the question of whether continued general dependency on PKI is appropriate. In this context, we examine Identity-Based Encryption (IBE) as an alternative to public-key encryption. Cascade encryption, or sequential multiple encryption, is the concept of layering encryption such that the ciphertext from one encryption step is the plaintext of the next. We describe CARIBE, a cascaded IBE scheme, for which we also provide a cascaded CCA security experiment, IND-ID-C.CCA, and prove its security in the computational model. CARIBE combines the ease of use of IBE with key escrow that is limited to the case in which the entire set of participating PKGs collaborates. Furthermore, we describe a particular CARIBE scheme, CARIBE-S, in which the receiver is a self-PKG, one of the several PKGs included in the cascade. CARIBE-S inherits IND-ID-C.CCA security from CARIBE and avoids key escrow entirely. In essence, CARIBE-S offers the maximum flexibility of the IBE paradigm and gives users complete control without the key escrow problem.
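    To illustrate the layering idea (the ciphertext of one step becomes the plaintext of the next), the sketch below cascades three independent keys using Fernet symmetric encryption as a stand-in for the per-PKG IBE encryption steps; decryption peels the layers in reverse order. This is a generic illustration of cascade encryption, not the CARIBE construction itself.

```python
# Generic cascade encryption: each layer encrypts the previous layer's ciphertext.
# Fernet (symmetric) stands in for the per-PKG IBE encryption steps of CARIBE.
from cryptography.fernet import Fernet

keys = [Fernet.generate_key() for _ in range(3)]   # one key per "PKG" in the cascade

def cascade_encrypt(keys, plaintext: bytes) -> bytes:
    ct = plaintext
    for k in keys:                                 # innermost layer first
        ct = Fernet(k).encrypt(ct)
    return ct

def cascade_decrypt(keys, ciphertext: bytes) -> bytes:
    pt = ciphertext
    for k in reversed(keys):                       # peel layers outermost first
        pt = Fernet(k).decrypt(pt)
    return pt

ct = cascade_encrypt(keys, b"message")
assert cascade_decrypt(keys, ct) == b"message"
```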

    Registration-Based Encryption: Removing Private-Key Generator from IBE

    In this work, we introduce the notion of registration-based encryption (RBE for short) with the goal of removing the trust that parties need to place in the private-key generator of an IBE scheme. In an RBE scheme, users sample their own public and secret keys. There is also a ``key curator'' whose job is only to aggregate the public keys of all the registered users and to update the short public parameter whenever a new user joins the system. Encryption can still be performed to a particular recipient using the recipient's identity and any public parameters released subsequent to the recipient's registration. Decryption requires some auxiliary information connecting users' public (and secret) keys to the public parameters; because of this, as the public parameters get updated, a decryptor may need to obtain additional auxiliary information a few times. More formally, if $n$ is the total number of identities and $\kappa$ is the security parameter, we require the following. Efficiency requirements: (1) a decryptor only needs to obtain updated auxiliary information for decryption at most $O(\log n)$ times in its lifetime, (2) each of these updates is computed by the key curator in time $poly(\kappa,\log n)$, and (3) the key curator updates the public parameter upon the registration of a new party in time $poly(\kappa,\log n)$; properties (2) and (3) require the key curator to have \emph{random} access to its data. Compactness requirements: (1) public parameters are always at most $poly(\kappa,\log n)$ bits, and (2) the total size of the updates a user ever needs for decryption is also at most $poly(\kappa,\log n)$ bits. We present feasibility results for constructions of RBE based on indistinguishability obfuscation. We further provide constructions of \emph{weakly efficient} RBE, in which the registration step runs in time $poly(\kappa, n)$, based on the CDH, Factoring, or LWE assumptions. Note that registration is done only once per identity, while the more frequent operation of generating updates for a user, which can happen many times, still runs in time $poly(\kappa,\log n)$. We leave open the problem of obtaining standard RBE (with $poly(\kappa,\log n)$ registration time) from standard assumptions.
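    One way to picture how a key curator can keep the public parameter short while supporting $O(\log n)$-size updates is a Merkle-tree accumulation of registered (identity, public key) pairs, in the spirit of the hash-based RBE constructions; the sketch below maintains such a tree, where the root acts as a compact commitment and each user's opening path has logarithmic length. It is a simplified illustration of the aggregation bookkeeping only and omits the garbling/obfuscation machinery the actual constructions need.

```python
# Simplified key-curator bookkeeping: registered (identity, public key) pairs are
# accumulated in a Merkle tree. The root plays the role of a short public
# parameter; a user's authentication path (the "auxiliary information") has
# O(log n) hashes. This omits everything the real RBE constructions layer on top.
import hashlib

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

def merkle_root(leaves):
    level = [h(identity.encode(), pk) for identity, pk in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0] if level else b""

def merkle_path(leaves, index):
    """Sibling hashes needed to recompute the root from leaf `index`."""
    level = [h(identity.encode(), pk) for identity, pk in leaves]
    path = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        path.append(level[index ^ 1])      # sibling at this level
        level = [h(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path                            # length is ceil(log2(n))

registered = [(f"user{i}", f"pk{i}".encode()) for i in range(8)]
pp = merkle_root(registered)               # short public parameter after 8 registrations
aux = merkle_path(registered, 3)           # O(log n) auxiliary information for user3
print(len(pp), len(aux))                   # 32-byte root, 3 sibling hashes
```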